Research Article | Open Access
Volume 2025 | Article ID 100071 | https://doi.org/10.1016/j.plaphe.2025.100071

Generation of labeled leaf point clouds for plant trait estimation

Gianmarco Roggiolani,1 Brian N. Bailey,2 Jens Behley,1 Cyrill Stachniss1,3

1Center for Robotics, University of Bonn, Germany
2Department of Plant Sciences, University of California, Davis, United States
3Lamarr Institute for Machine Learning and Artificial Intelligence, Germany

Received: 10 Apr 2025
Accepted: 08 Jun 2025
Published: 16 Jun 2025

Abstract

Today, leaf trait estimation remains a labor-intensive process. The effort required to obtain ground truth measurements limits how accurately this task can be performed automatically. Traditionally, plant scientists manually measure the traits of harvested leaves and associate them with sensor data, which is key for training machine learning approaches and automating the process. In this paper, we propose a neural network-based method for generating synthetic 3D point clouds of leaves together with their associated traits to support phenotyping approaches. We use real-world leaf point clouds to learn how to generate realistic leaves from an automatically extracted leaf skeleton. We then use the generated leaves to fine-tune different leaf trait estimation methods. We evaluate our generated data with different trait estimation methods and compare the results against real-world data and other synthetic datasets from agricultural simulation software. Experiments show that our approach generates leaf point clouds with high similarity to real-world leaves. Tuning trait estimation methods on our generated data improves their performance when estimating the traits of real-world leaves, making our data valuable for developing and testing data-driven trait estimation methods. Accurate trait estimation is key to understanding crop growth, productivity, and pest resistance, as leaf size directly influences photosynthesis, yield potential, and vulnerability to insects and fungal growth.

© 2019-2023 Plant Phenomics. All rights reserved. ISSN 2643-6515.
